Phishing Campaign Assessment
«RVA-Number»
«Report-Date»
«Org-Name» («Acronym»)
Contents
1 Executive Report 4
2 «Acronym» Methodology Specifics 5
3 Summary of Testing Activities and Results 5
3.1 Click Rate vs. Report Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
3.2 Click Time vs. Report Time . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
3.3 Deception Indicators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3.4 Open-Source Information Gathering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
4 Closing 11
Appendix A: Methodology 13
Pre-planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Reporting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Appendix B: Detailed Results 16
Appendix C: Templates 24
Deception Calculator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Engagement Specific Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Appendix D: Landing/Re-direct Page 34
Appendix E: Top Phishing Techniques & Scams 35
Techniques . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Scams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Appendix F: Acronyms 37
List of Figures
1 Unique user click rate vs. report rate per level of deception . . . . . . . . . . . . . . . . . . . . . 8
2 Timeline of unique user clicks across all levels . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
3 Time to first click and report (HH:MM:SS) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
4 Click rates based on deception indicators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
5 Unique click, total click, and report results by level . . . . . . . . . . . . . . . . . . . . . . . . . 17
6 Breakdown of multiple clicks by level . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
7 Clicking user timeline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
8 Count of unique clicks by “Office” per level . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
9 Percentage of “Office” that clicked by level . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
10 Count of unique clicks by level per “Office” . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
11 Percentage of total unique clicks belonging to each “Office” by level . . . . . . . . . . . . . . . . 23
List of Tables
1 Targeted user measurements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
2 Key finding recommendations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
3 Email template overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
4 Unique user click rate and report rate results . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
5 Click time vs. report time (HH:MM:SS) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
6 Email reconnaissance results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
7 Sample schedule outlines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
8 Weekly click and report results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
9 Event log . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
10 Unique clicks per “Office” . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
11 Percentage of unique clicks per level by office . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
12 Count of unique clicks per level by office . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
13 Example Phishing email template deception calculator . . . . . . . . . . . . . . . . . . . . . . . 25
14 Phishing email template deception calculator . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
15 Email reconnaissance source descriptions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Phishing Campaign Assessment Details

Customer Name | «Org-Name» («Acronym»)
Customer POC | «POC-Name», «POC-Email»
CCA Team Lead | «FED-Name», «FED-Email»
Dates | «Start-Date» to «End-Date»
Test Location | CISA/CCA Lab, Arlington, VA
Scope | «Num-Users» users within the following domain: «Target-Domain»
Services | Phishing Campaign Assessment
CCA ID | «RVA-Number»
Page | 3 of 37
1 Executive Report
The Cybersecurity and Infrastructure Security Agency (CISA) Cyber Assessments (CCA) team produced this
report for «Org-Name» («Acronym») in support of a Phishing Campaign Assessment (PCA) conducted «Start-Date»
to «End-Date». CCA conducts PCAs, upon customer request, to measure an organization’s propensity to click on
email phishing lures. PCA is a practical exercise intended to support and measure the effectiveness of security
awareness training for information system users. The results of this PCA show the potential susceptibility of
«Acronym» personnel to social engineering attacks—specifically email phishing attacks—in which an adversary
tricks an email user into clicking a malicious link to gain unauthorized network access. The assessment’s
goal was to isolate the human behavioral response to these types of attacks; no exploits were used during
the assessment. The assessment operated under the scenario that technical controls were unable to detect,
report, or stop the email phishing attempts from reaching the end user. CCA measured «Acronym»’s level of
susceptibility to a phishing attack by using targeted user click rates, click times, response rates, and response
times, as shown in table 1. This report aims to enhance «Acronym»’s understanding of their information system
users’ cybersecurity behavior and to promote a secure and resilient workforce.
Table 1: Targeted user measurements

User Activity Metrics | Results
Total users targeted for phishing | «Num-Users»
# of emails (phishing attempts) sent overall | «Sum-Emails-Sent» («Num-Email-Per-User» per user)
# of clicked emails (successful phishing attempts) overall [1] | «Sum-Unique-Clicks» («Click-Rate»% click rate)
# of phished users overall [2] | «Unique-User-Clicks» («Unique-User-Click-PercentPop»% of target population)
# of user reports sent to helpdesk overall [3] | «Total-User-Reports» («Report-Rate» report rate)
Ratio of reports-to-clicks | «Report-Ratio»
Average time to first click [4][5] | «Ave-Time-First-Click»
Average time to first report [4] | «Ave-Time-First-Report»
Most clicked phishing template | «Most-Successful»

CCA has provided this PCA to «Acronym» at no cost and coordinated all activities, including planning and
testing, with «Acronym»’s point of contact (POC). «Acronym» maintained control over the testing, including
providing target email addresses, approving phishing email templates, approving testing timeframes, and
adjusting mail security settings to ensure inbox access. This PCA was not intended to, and did not, test
technical controls or electronic protections designed to block phishing attempts. This PCA spanned a six-week
period and aimed to capture metrics of «Acronym» information system users’ reactions to phishing emails of
multiple deception levels.

[1] Click rate is the total number of emails with “malicious” links that users clicked divided by the total number of emails sent.
[2] There were «Unique-User-Clicks» users who clicked a “malicious” link at least once by the end of the PCA out of the «Num-Users»
that were phished. There were «User-Multi-Campaign» users who clicked in more than one campaign.
[3] Reporting rate is the total number of user reports to the helpdesk divided by the total number of phishing emails sent.
[4] Average time to first click and first report is calculated with the geometric mean to compensate for a small sample size that is
sensitive to skewing by outliers.
[5] Click time is the time the user clicked the link within the email minus the time CCA sent the email.
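The footnotes note that the average times to first click and first report use a geometric mean to blunt outlier skew in a small sample. A minimal sketch of that calculation, using hypothetical times (the real values come from campaign logs):

```python
# A sketch of the geometric-mean timing described in the footnotes;
# the times-to-first-click below are hypothetical.
import math
from datetime import timedelta

def geometric_mean_time(times):
    """Geometric mean of a list of timedeltas, returned as a timedelta.

    Chosen over the arithmetic mean because one very slow first click in a
    small sample would otherwise dominate the average.
    """
    seconds = [t.total_seconds() for t in times]
    log_mean = sum(math.log(s) for s in seconds) / len(seconds)
    return timedelta(seconds=math.exp(log_mean))

first_clicks = [  # hypothetical HH:MM:SS values for six campaigns
    timedelta(minutes=2, seconds=30),
    timedelta(minutes=5),
    timedelta(hours=1, minutes=10),
    timedelta(minutes=12),
    timedelta(minutes=45),
    timedelta(minutes=3, seconds=15),
]
print(geometric_mean_time(first_clicks))
```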
CCA has established this voluntary service to help organizations support security awareness training efforts
and decrease information system userbase susceptibility and vulnerability to phishing attempts.
Based on the assessment data, CCA recommends implementing the key finding recommendations listed in
table 2.
Table 2: Key finding recommendations
Key Finding Recommendations
«Finding-1»
«Finding-2»
«Finding-3»
The remainder of this report provides findings and metrics of the CCA phishing service for «Acronym».
2 «Acronym» Methodology Specifics
CCA used the methodology documented in Appendix A: Methodology to perform testing of «Acronym». Specific
phishing templates, with their corresponding deception calculations, can be reviewed in Appendix C: Templates.
«Group-String» See table 3 for descriptions of email levels 1–6.
The assessment’s goal was to capture the behavior-based responses of «Acronym» to email phishing attempts;
the assessment operated under the scenario that no technical controls were able to detect, report, or stop the
email phishing attempts. CCA, as an external body conducting this PCA, did not have direct knowledge of
whether emails successfully arrived in targeted «Acronym» user inboxes. To allow for a clear determination
of click rates (email links clicked divided by emails sent), «Acronym» or CCA performed the following:
3 Summary of Testing Activities and Results
This PCA concentrated on how phishing email deception affected «Acronym» user click behavior and response
behavior. CCA expects that a more deceptive phishing email has a higher likelihood of being clicked
and a lower likelihood of being reported. This assessment also gathered organizational emails through open-
source information techniques to determine «Acronym»’s potentially attackable online presence. See Appendix
B: Detailed Results for detailed phishing results and Appendix C: Templates for a detailed explanation on de-
ception levels.
Over six weeks, «Acronym»’s targeted users received at least «Emails-Per-User» phishing emails of increasing
deception. Levels 1–3 were “easier to detect” and levels 4–6 were “more difficult to detect,” based on the
number and type of indicators used. The table below summarizes the phishing templates used in this PCA.
Table 3: Email template overview [7]

Level | Campaign Subject | Template Description | Link Displayed
1 | «Level-1-Subject» | «Level-1-Description» | «Level-1-Link-Type»: «Level-1-Display-Link» «Level-1-Hover-Over» «Level-1-Url»
2 | «Level-2-Subject» | «Level-2-Description» | «Level-2-Link-Type»: «Level-2-Display-Link» «Level-2-Hover-Over» «Level-2-Url»
3 | «Level-3-Subject» | «Level-3-Description» | «Level-3-Link-Type»: «Level-3-Display-Link» «Level-3-Hover-Over» «Level-3-Url»
4 | «Level-4-Subject» | «Level-4-Description» | «Level-4-Link-Type»: «Level-4-Display-Link» «Level-4-Hover-Over» «Level-4-Url»
5 | «Level-5-Subject» | «Level-5-Description» | «Level-5-Link-Type»: «Level-5-Display-Link» «Level-5-Hover-Over» «Level-5-Url»
6 | «Level-6-Subject» | «Level-6-Description» | «Level-6-Link-Type»: «Level-6-Display-Link» «Level-6-Hover-Over» «Level-6-Url»
3.1 Click Rate vs. Report Rate
One of the most common measures of an organization’s susceptibility to phishing attacks is the targeted-user
click rate. If an actual phishing attack is able to bypass technical controls and arrive in a user’s inbox, the
human target must click on the malicious link or attachment for the attack to be successful. In practical
exercises such as this PCA, targeted-user click rates will fluctuate depending on the level of email deception used
in testing. Effective security awareness training, however, should result in a noticeable click rate decrease over
time and to a level deemed acceptable based on the organization’s risk management posture.
The counterpart to the user click rate is the user report rate, determined by the number of emails or calls
sent to alert «Acronym»’s helpdesk during each campaign. Based on previous testing, CCA recommends that
organizations initially aim to have two people reporting the phishing attempts for every person that clicks. This
ratio ensures that there is not only reporting coverage for the person clicking, but also redundant coverage
in case the person who clicks the link does not report or does not realize they have been phished. Effective
security awareness training should result in a noticeable report rate increase over time and to a level deemed
acceptable based on the organization’s risk management posture.
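The rate and ratio definitions above can be sketched as follows. The counts are hypothetical; the real per-campaign values appear in table 4.

```python
# A minimal sketch of the click rate, report rate, and reports-to-clicks
# ratio described above, with hypothetical counts.

def campaign_rates(emails_sent, unique_clicks, user_reports):
    click_rate = unique_clicks / emails_sent   # share of emails clicked at least once
    report_rate = user_reports / emails_sent   # share of emails reported
    # CCA's suggested initial target: at least 2 reports for every click.
    ratio = user_reports / unique_clicks if unique_clicks else float("inf")
    return click_rate, report_rate, ratio

click, report, ratio = campaign_rates(emails_sent=500, unique_clicks=40, user_reports=60)
print(f"click {click:.1%}, report {report:.1%}, ratio {ratio:.1f}:1, "
      f"meets 2:1 target: {ratio >= 2}")
```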
In this report, the percentages shown by the user click rate and user report rate represent, respectively,
the percentage of targeted people (determined by a unique email address) who have clicked at least once on
a “malicious” link and those who reported a suspicious email. The number of unique clicks correlates to the
number of end user devices potentially compromised in each campaign. The number of user reports correlates
to the number of opportunities the «Acronym» security team had to identify a potential breach and reduce its
[7] A displayed link of type “Hidden” means hyperlinked text was used instead of a bare “Written Out” URL. A displayed link type of
“Spoofed” means a fake URL or filename was used that would not match the “Hover Over” URL or file information. A displayed link type
of “Written Out” always matches the “Hover Over” URL.
impact. Table 4 and figure 1 summarize click and report rates.
Table 4: Unique user click rate and report rate results

Level | Campaign Subject | User Click Rate | Unique Clicks | User Report Rate | User Reports | Reporting Ratio
1 | «Level-1-Subject» | «Level-1-Click-Rate»% | «Level-1-User-Clicks» | «Level-1-User-Report-Rate» | «Level-1-User-Reports» | «Level-1-Report-Ratio»
2 | «Level-2-Subject» | «Level-2-Click-Rate»% | «Level-2-User-Clicks» | «Level-2-User-Report-Rate» | «Level-2-User-Reports» | «Level-2-Report-Ratio»
3 | «Level-3-Subject» | «Level-3-Click-Rate»% | «Level-3-User-Clicks» | «Level-3-User-Report-Rate» | «Level-3-User-Reports» | «Level-3-Report-Ratio»
4 | «Level-4-Subject» | «Level-4-Click-Rate»% | «Level-4-User-Clicks» | «Level-4-User-Report-Rate» | «Level-4-User-Reports» | «Level-4-Report-Ratio»
5 | «Level-5-Subject» | «Level-5-Click-Rate»% | «Level-5-User-Clicks» | «Level-5-User-Report-Rate» | «Level-5-User-Reports» | «Level-5-Report-Ratio»
6 | «Level-6-Subject» | «Level-6-Click-Rate»% | «Level-6-User-Clicks» | «Level-6-User-Report-Rate» | «Level-6-User-Reports» | «Level-6-Report-Ratio»
Figure 1: Unique user click rate vs. report rate per level of deception
3.2 Click Time vs. Report Time
To contain an attack, an organization’s security team must be made aware of the potential breach. In the
scenario that this PCA tested, no technical controls were triggered during a phishing attack and the clock for
a potential breach started once a targeted user clicked on a “malicious” link. Timely user reporting decreases
the window of opportunity that an adversary has to access data or gain further network entry. Timely reporting
also increases the opportunity the security team has to detect and respond to a potential breach. By educating
users on how to both spot and promptly respond to phishing attempts, an organization can improve its
anti-phishing defenses. The following table details the time to first click and first report throughout the
assessment and the lead or lag times for incident response measures to be activated (time elapsed represented
in hours:minutes:seconds).
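The lead/lag determination in table 5 can be sketched as follows, using hypothetical timestamps: a "lead" means the first report arrived before the first click, giving the security team a head start.

```python
# Sketch of the time-to-first-click, time-to-first-report, and lead/lag gap
# calculations; the send and event timestamps below are hypothetical.
from datetime import datetime, timedelta

def first_event_gap(sent, first_click, first_report):
    time_to_click = first_click - sent
    time_to_report = first_report - sent
    gap_type = "lead" if time_to_report < time_to_click else "lag"
    return time_to_click, time_to_report, abs(time_to_report - time_to_click), gap_type

sent = datetime(2024, 3, 4, 9, 0, 0)
tc, tr, gap, kind = first_event_gap(
    sent,
    first_click=sent + timedelta(minutes=14),
    first_report=sent + timedelta(minutes=6),
)
print(tc, tr, gap, kind)
```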
Table 5: Click time vs. report time (HH:MM:SS)

Level | Time to First Click | Time to First Report | Time Gap (Lead or Lag)
1 | «Level-1-Time-To-First-Click» | «Level-1-Time-To-First-Report» | «Level-1-Time-Gap» «Level-1-Gap-Type»
2 | «Level-2-Time-To-First-Click» | «Level-2-Time-To-First-Report» | «Level-2-Time-Gap» «Level-2-Gap-Type»
3 | «Level-3-Time-To-First-Click» | «Level-3-Time-To-First-Report» | «Level-3-Time-Gap» «Level-3-Gap-Type»
4 | «Level-4-Time-To-First-Click» | «Level-4-Time-To-First-Report» | «Level-4-Time-Gap» «Level-4-Gap-Type»
5 | «Level-5-Time-To-First-Click» | «Level-5-Time-To-First-Report» | «Level-5-Time-Gap» «Level-5-Gap-Type»
6 | «Level-6-Time-To-First-Click» | «Level-6-Time-To-First-Report» | «Level-6-Time-Gap» «Level-6-Gap-Type»
The figure below shows the percentage of users who clicked during certain time intervals in the first 24 hours
of a campaign. Overall, nearly «Unique-Click-Timeline-60-min» percent of all clicks occurred within one hour of
receiving a phishing email. The median time to click was «Median-Time-To-Click» for the entire assessment.
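The cumulative timeline behind figure 2 can be sketched as below. The interval boundaries and click offsets are hypothetical; real offsets would be computed as click time minus send time from the campaign logs.

```python
# Sketch of the cumulative click-timeline bucketing: what share of unique
# clicks occurred at or before each interval boundary in the first 24 hours.
from datetime import timedelta

BOUNDARIES = [timedelta(minutes=m) for m in (5, 15, 30, 60)] + [
    timedelta(hours=h) for h in (4, 12, 24)
]

def cumulative_click_percentages(click_offsets):
    """Percent of unique clicks that occurred at or before each boundary."""
    total = len(click_offsets)
    return {
        str(b): 100 * sum(1 for t in click_offsets if t <= b) / total
        for b in BOUNDARIES
    }

# Hypothetical send-to-click offsets, in minutes.
offsets = [timedelta(minutes=m) for m in (1, 3, 9, 40, 55, 300, 1300)]
print(cumulative_click_percentages(offsets))
```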
Figure 2: Timeline of unique user clicks across all levels
Figure 3 shows the amount of time for the first user to click on a link in an email from the time it was sent
(elapsed time displayed as HH:MM:SS).
Figure 3: Time to first click and report (HH:MM:SS)
3.3 Deception Indicators
Figure 4 shows the percentage of clicks for each tested deception category and indicator. For example, in
the behavior category, the fear indicator had a «complexity-behavior-fear-Click-Rate» percent unique click rate;
that is, «complexity-behavior-fear-Click-Rate» percent of all emails sent with a fear indicator were clicked at
least once. In the sender category, «complexity-sender-internal-Generic-Click-Rate» percent of all emails with a
generic internal sender indicator were clicked at least once.
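The per-indicator rate in figure 4 can be sketched as follows, with hypothetical campaign data: each template carries a set of deception indicators, and an indicator's click rate is the share of all emails sent with that indicator that were clicked at least once.

```python
# Sketch of the per-indicator click-rate aggregation; the campaigns,
# indicator names, and counts below are hypothetical.
from collections import defaultdict

campaigns = [
    # (deception indicators on the template, emails sent, unique clicks)
    ({("behavior", "fear"), ("sender", "internal-generic")}, 500, 60),
    ({("behavior", "fear"), ("relevancy", "organization")}, 500, 45),
    ({("sender", "internal-generic")}, 500, 20),
]

sent = defaultdict(int)
clicked = defaultdict(int)
for indicators, n_sent, n_clicked in campaigns:
    for indicator in indicators:
        sent[indicator] += n_sent
        clicked[indicator] += n_clicked

indicator_rates = {ind: 100 * clicked[ind] / sent[ind] for ind in sent}
print(indicator_rates)
```

As the report notes, no causal link can be drawn from a single indicator, since every email mixes several; this aggregation only shows tendencies.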
Figure 4: Click rates based on deception indicators
The vertical red line indicates the click rate («Click-Rate») for the overall assessment. Indicators with a
click rate higher than the average show what type of low deception indicators (red flags) were ignored more
often than others or what type of high deception indicators (sophisticated techniques) were more effective than
others in convincing «Acronym» personnel to click. No direct causal link can be drawn between indicator and
click, as each email had a combination or lack of various indicators. This information, however, can help guide
anti-phishing training and awareness by showing what level of deception indicators and categories users have
shown a greater tendency to disregard or believe.
3.4 Open-Source Information Gathering
For this PCA, CCA looked for «Acronym» email addresses publicly available online. The following table is a
summary of the email addresses discovered through passive reconnaissance and collected using open-source
information gathering tools. The emails CCA discovered were not used in this engagement unless previously
provided by the technical POC.
Table 6: Email reconnaissance results

Item | Result
Email domain searched | «Target-Domain»
Date search performed | «Email-Recon-Date-Time-Search»
# Unique email addresses found | «Email-Recon-Total-Emails-Discovered»
# Matching list of user emails provided by «Acronym» | «Email-Recon-Email-List-Match» («Email-Recon-Match-Rate»% of total target list)
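The match-rate figure in table 6 is a simple set intersection; a minimal sketch with hypothetical addresses:

```python
# Sketch of the table 6 match-rate calculation: how many addresses found
# via passive reconnaissance also appear on the customer-provided target
# list. All addresses below are hypothetical.
discovered = {"alice@example.gov", "bob@example.gov", "press@example.gov"}
target_list = {"alice@example.gov", "bob@example.gov", "carol@example.gov",
               "dave@example.gov"}

matches = discovered & target_list
match_rate = 100 * len(matches) / len(target_list)
print(f"{len(discovered)} found, {len(matches)} on target list "
      f"({match_rate:.0f}% of total target list)")
```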
Based on previous testing, most of the email addresses discovered during passive reconnaissance are
sourced from organizational documents and presentations shared online. To limit the exposure of exploitable
organizational information, CCA recommends that employee names and emails be limited in use on websites
and in reports or presentations stored on the public internet. When announcing new products or updating online
registration, CCA also recommends that organizations use generic distribution email addresses as opposed to
specific employee names.
4 Closing
Under this PCA’s attack scenario, CCA measured «Acronym» information system user susceptibility to a
successful email phishing attack in the case where technical controls failed to detect, report, or stop the attack.
Targeted-userbase vulnerability was measured through four key metrics:

- click rate (number of email links clicked divided by number of emails sent),
- click time (the time the user clicked the “malicious” link minus the time CCA sent the phishing emails),
- report rate (number of user alerts to the security office or helpdesk of a phishing attempt divided by number of emails sent), and
- report time (the time the user alerted the security office or helpdesk minus the time CCA sent the phishing emails).
To collect these metrics, CCA sent six emails of varying deception levels to the email list provided by «Acronym».
The overall click rate was «Click-Rate» percent and the overall reporting rate was «Report-Rate» percent when
analyzing all emails clicked and reports submitted across all levels of deception.
<<Detailed_Conclusion>>
CCA appreciates any comments to improve this report or service as a whole. For questions about this report
or for future engagements with CCA, please send an email to NCATS@hq.dhs.gov.
Appendix A: Methodology
The Phishing Campaign Assessment (PCA) measures the susceptibility of an organization’s personnel to
social engineering attacks, specifically email phishing attacks. During the phishing assessment, click rate and
reporting rate metrics are gathered and no malicious payloads are actually sent. This assessment is conducted
in four phases across approximately six weeks:
1. Pre-planning
2. Planning
3. Testing
4. Reporting
Pre-planning
During the pre-planning phase, CCA offers a capability brief of all CCA services, including PCAs. In addition,
a Service Request Form (SRF) and Rules of Engagement (ROE) document are provided, of which the ROE is the
legal agreement for testing between CCA and the requesting organization. The customer can request a PCA
by submitting a completed SRF and ROE. Note: The ROE appendices are not required to be completed at the
time of submission, and an organization only needs one signed ROE on file with CCA to request additional CCA
assessments.
Planning
Once a completed ROE is signed, a testing timeframe is established with the customer. A CCA lead engineer
is assigned to the test and works directly with the customer POC for the remainder of the assessment.
Additionally, CCA holds a planning meeting to provide the customer with details of the service and to obtain testing
specifics. This meeting is an engineering-level meeting that includes conversations on the following items:

- Signed rules of engagement
- Appendix A (start/end date, technical POC)
- Template approval
- Redirect page determination
- Phishing template process
- Draft templates
- List of individual email addresses
- Redirect page determination
- Preparation week (the week prior to testing)
- Send one to five test emails
- Whitelisting activities if needed
In addition, Appendix A of the ROE is completed to define the scope of this single PCA.
Next, the CCA federal lead finalizes the start date to begin preparation and testing activities. The customer
approves the schedule and provides a list of email addresses to be used throughout the phishing assessment.
Emails may be divided into two or more groups and email delivery times may vary to help obfuscate PCA
activities. The table below outlines a sample schedule:
Table 7: Sample schedule outlines
Week Template Email Group Day Time
1 Level 1 1 Monday 9 a.m.
2 Level 2 2 Wednesday 12 p.m.
3 Level 3 3 Thursday 9 a.m.
4 Level 4 4 Tuesday 2 p.m.
5 Level 5 2 Friday 9 a.m.
6 Level 6 3 Friday 9 a.m.
The phishing templates used throughout the assessment vary in six levels of deception. The deception of
each template is scored based on rankings in the following categories: (1) Appearance, (2) Sender,
(3) Relevancy, and (4) Behavior. The CCA engineer may provide multiple templates per level for the customer to
approve. The customer is asked to review each template, modify it if desired, and select one template to be
used each week.
All testing activities are conducted from CISA testing facilities external to the customer’s site.
The week prior to the start of the assessment, CCA engineers work with the customer’s technical POC to
determine if test emails are successfully delivered to a user’s inbox. If an email is not delivered, CCA requests
that the customer whitelist the CCA network and domain name(s). The goal of the PCA is to capture click rate
metrics and, although testing may identify technological strengths and weaknesses, this service is not intended
to test an organization’s technical controls against phishing attempts.
Testing
During the testing phase, CCA will generate and send a phishing email to a targeted list of email addresses
provided and agreed upon by the customer. The customer is asked to provide a list of between 100 and 7,500
email addresses to ensure enough data is collected throughout the assessment. Each user typically receives a
minimum of two emails during the entire assessment. Each email sent varies in deception; the first email is typically
a lower-ranking level of deception (three or lower), whereas the second email is a high-ranking level of deception
(four or higher).
Within the email, a user is enticed to click on a “malicious” link. When clicked, the link redirects a user
to a page of the customer’s choice. CCA is able to track the percentage of users that click the link, providing
insight into the effectiveness of any existing security awareness programs and measuring susceptibility to
an attack from this vector. Metrics are collected for a minimum of three business days, and typically up to five
business days, depending on click rate frequency.
Throughout each assessment week, a CCA engineer coordinates all activities, including phishing template
creation, campaign launch times, and campaign completion. Additionally, if the customer desires, weekly results
can be provided that include preliminary metrics on campaign status, number of emails sent, email deception,
click rate, and number of clicks.
Reporting
During the reporting phase, the CCA team begins report writing as the customer provides response rate
metrics. The PCA report is manually generated; the main section is tailored to business executives and the
appendices are intended for managers and engineers. The report shows how susceptible the organization’s
users are to phishing emails in terms of users who click, as well as those who
report. The report provides data and statistics, but not business analysis. The report does not show attribution
to users that clicked or did not click a potentially malicious link.
The report is only sent to the designated POC and any inquiries about the report from a third party are
directed to the POC for response. Each PCA is considered complete with the delivery of the final report.
Appendix B: Detailed Results
Appendix B is a listing of detailed results collected throughout testing. Table 8 lists weekly click rates
captured and report rates «Acronym» collected and submitted to CCA throughout testing.
Table 8: Weekly click and report results

Week | Campaign Subject | Emails Sent | Total Clicks | User Clicks | Click Rate | User Reports | Report Rate
1 | «Level-1-Subject» | «Level-1-Emails-Sent-Attempted» | «Level-1-Total-Clicks» | «Level-1-User-Clicks» | «Level-1-Click-Rate»% | «Level-1-User-Reports» | «Level-1-User-Report-Rate»
2 | «Level-2-Subject» | «Level-2-Emails-Sent-Attempted» | «Level-2-Total-Clicks» | «Level-2-User-Clicks» | «Level-2-Click-Rate»% | «Level-2-User-Reports» | «Level-2-User-Report-Rate»
3 | «Level-3-Subject» | «Level-3-Emails-Sent-Attempted» | «Level-3-Total-Clicks» | «Level-3-User-Clicks» | «Level-3-Click-Rate»% | «Level-3-User-Reports» | «Level-3-User-Report-Rate»
4 | «Level-4-Subject» | «Level-4-Emails-Sent-Attempted» | «Level-4-Total-Clicks» | «Level-4-User-Clicks» | «Level-4-Click-Rate»% | «Level-4-User-Reports» | «Level-4-User-Report-Rate»
5 | «Level-5-Subject» | «Level-5-Emails-Sent-Attempted» | «Level-5-Total-Clicks» | «Level-5-User-Clicks» | «Level-5-Click-Rate»% | «Level-5-User-Reports» | «Level-5-User-Report-Rate»
6 | «Level-6-Subject» | «Level-6-Emails-Sent-Attempted» | «Level-6-Total-Clicks» | «Level-6-User-Clicks» | «Level-6-Click-Rate»% | «Level-6-User-Reports» | «Level-6-User-Report-Rate»
Figure 5 shows the comparison of unique clicks, total clicks, and reports per level of deception.
Figure 5: Unique click, total click, and report results by level
Figure 6 shows the number of users who clicked once, 2–3 times, 4–5 times, 6–10 times, and more than
10 times per campaign, broken down by deception level. Users who click more than once are creating multiple
command-and-control opportunities for the adversary.
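The figure 6 binning can be sketched as follows: count each user's clicks from the campaign click log, then group users into the same bins (1, 2-3, 4-5, 6-10, more than 10). The click log below is hypothetical.

```python
# Sketch of the multiple-click binning behind figure 6; user IDs and the
# click log are hypothetical stand-ins for real campaign data.
from collections import Counter

click_log = ["u1", "u2", "u2", "u3", "u3", "u3", "u3", "u4"] + ["u5"] * 12

def bin_label(n_clicks):
    if n_clicks == 1:
        return "1"
    if n_clicks <= 3:
        return "2-3"
    if n_clicks <= 5:
        return "4-5"
    if n_clicks <= 10:
        return "6-10"
    return ">10"

clicks_per_user = Counter(click_log)          # clicks per unique user
bins = Counter(bin_label(n) for n in clicks_per_user.values())
print(dict(bins))
```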
Figure 6: Breakdown of multiple clicks by level
Table 9 shows the time-related event log throughout testing.
Figure 7 shows the percentage of clicking users who had clicked by certain time intervals for each of the six
deception levels.
Table 9: Event log

Week | Campaign Subject | Date Sent (ET) | Time to First Click (HH:MM:SS) | Median Time to Click (HH:MM:SS) | Length of Campaign (Days)
1 | «Level-1-Subject» | «Level-1-Start-Date» | «Level-1-Time-To-First-Click» | «Level-1-Median-Time-To-Click» | «Level-1-Length»
2 | «Level-2-Subject» | «Level-2-Start-Date» | «Level-2-Time-To-First-Click» | «Level-2-Median-Time-To-Click» | «Level-2-Length»
3 | «Level-3-Subject» | «Level-3-Start-Date» | «Level-3-Time-To-First-Click» | «Level-3-Median-Time-To-Click» | «Level-3-Length»
4 | «Level-4-Subject» | «Level-4-Start-Date» | «Level-4-Time-To-First-Click» | «Level-4-Median-Time-To-Click» | «Level-4-Length»
5 | «Level-5-Subject» | «Level-5-Start-Date» | «Level-5-Time-To-First-Click» | «Level-5-Median-Time-To-Click» | «Level-5-Length»
6 | «Level-6-Subject» | «Level-6-Start-Date» | «Level-6-Time-To-First-Click» | «Level-6-Median-Time-To-Click» | «Level-6-Length»
Figure 7: Clicking user timeline
Table 10 shows the number and percentage of unique and total email clicks by the top five clicking “Office”
designations provided by «Acronym» with 10 or more users. All other offices are listed in the “Other” category.
Percentage of office that clicked is based on the office count. Percentage of all unique clicks is based on the
unique clicks gathered for the whole assessment. Office unique click rate overall is based on total emails sent
to that office. For example, «Labels-1-Name» clicked on «Labels-1-Click-Rate» percent of all emails sent to that
office. Additionally, «Labels-1-Percent-Of-All-Clicks» percent of all unique clicks gathered during the PCA came
from «Labels-1-Name».
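The three per-office percentages described above can be sketched as follows, with hypothetical office names and counts:

```python
# Sketch of the table 10 percentages; office names and counts are
# hypothetical. Offices with fewer than 10 users would be folded into
# "Other" per the report's convention.

def office_stats(users, emails_sent, unique_clicks, total_unique_clicks):
    """(% of office that clicked, % of all unique clicks, office click rate)."""
    return (
        100 * unique_clicks / users,              # based on office head count
        100 * unique_clicks / total_unique_clicks,  # share of all unique clicks
        100 * unique_clicks / emails_sent,        # based on emails sent to office
    )

offices = {  # office: (users, emails sent, unique clicks)
    "Office A": (120, 720, 30),
    "Office B": (45, 270, 9),
    "Other": (35, 210, 6),
}
total_unique = sum(clicks for _, _, clicks in offices.values())
for name, (users, emails, clicks) in offices.items():
    print(name, office_stats(users, emails, clicks, total_unique))
```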
Table 10: Unique clicks per “Office”

Office | Count | Total Emails Sent | Unique Clicks | Total Clicks | Percent of Office that Clicked at Least Once | Percent of All Unique Clicks from Office | Office Unique Click Rate Overall
«Labels-1-Name» | «Labels-1-Total-Users» | «Labels-1-Total-Sent» | «Labels-1-Unique-Clicks» | «Labels-1-Total-Clicks» | «Labels-1-Phish-Rate»% | «Labels-1-Percent-Of-All-Clicks»% | «Labels-1-Click-Rate»%
«Labels-2-Name» | «Labels-2-Total-Users» | «Labels-2-Total-Sent» | «Labels-2-Unique-Clicks» | «Labels-2-Total-Clicks» | «Labels-2-Phish-Rate»% | «Labels-2-Percent-Of-All-Clicks»% | «Labels-2-Click-Rate»%
«Labels-3-Name» | «Labels-3-Total-Users» | «Labels-3-Total-Sent» | «Labels-3-Unique-Clicks» | «Labels-3-Total-Clicks» | «Labels-3-Phish-Rate»% | «Labels-3-Percent-Of-All-Clicks»% | «Labels-3-Click-Rate»%
«Labels-4-Name» | «Labels-4-Total-Users» | «Labels-4-Total-Sent» | «Labels-4-Unique-Clicks» | «Labels-4-Total-Clicks» | «Labels-4-Phish-Rate»% | «Labels-4-Percent-Of-All-Clicks»% | «Labels-4-Click-Rate»%
«Labels-5-Name» | «Labels-5-Total-Users» | «Labels-5-Total-Sent» | «Labels-5-Unique-Clicks» | «Labels-5-Total-Clicks» | «Labels-5-Phish-Rate»% | «Labels-5-Percent-Of-All-Clicks»% | «Labels-5-Click-Rate»%
«Labels-Other-Name» | «Labels-Other-Total-Users» | «Labels-Other-Total-Sent» | «Labels-Other-Unique-Clicks» | «Labels-Other-Total-Clicks» | «Labels-Other-Phish-Rate»% | «Labels-Other-Percent-Of-All-Clicks»% | «Labels-Other-Click-Rate»%
Figure 8 shows the number of unique clicks at each level for the different “Office” designations.
Figure 8: Count of unique clicks by “Office” per level
Figure 9 shows the percentage of each “Office” that clicked by level. The percentage is relative to individual
“Office” totals, not to the whole of «Acronym». For example, «Labels-1-1-Click-Rate» percent of
«Labels-1-Name» clicked on level 1.
Figure 9: Percentage of “Office” that clicked by level
Tables 11 and 12 show the number of unique clicks (and the percentage of total unique clicks) by deception
level per “Office.” For example, «Labels-1-1-Percent-Of-All-Clicks» percent of all clicks in Level 1 came from
«Labels-1-Name».
Table 11: Percentage of unique clicks per level by office
Level | Emails Sent | Unique Clicks | Total Clicks | «Labels-1-Name» | «Labels-2-Name» | «Labels-3-Name» | «Labels-4-Name» | «Labels-5-Name» | Other
1 | «Level-1-Emails-Sent-Attempted» | «Level-1-User-Clicks» | «Level-1-Total-Clicks» | «Labels-1-1-Percent-Of-All-Clicks»% | «Labels-2-1-Percent-Of-All-Clicks»% | «Labels-3-1-Percent-Of-All-Clicks»% | «Labels-4-1-Percent-Of-All-Clicks»% | «Labels-5-1-Percent-Of-All-Clicks»% | «Labels-Other-1-Percent-Of-All-Clicks»%
2 | «Level-2-Emails-Sent-Attempted» | «Level-2-User-Clicks» | «Level-2-Total-Clicks» | «Labels-1-2-Percent-Of-All-Clicks»% | «Labels-2-2-Percent-Of-All-Clicks»% | «Labels-3-2-Percent-Of-All-Clicks»% | «Labels-4-2-Percent-Of-All-Clicks»% | «Labels-5-2-Percent-Of-All-Clicks»% | «Labels-Other-2-Percent-Of-All-Clicks»%
3 | «Level-3-Emails-Sent-Attempted» | «Level-3-User-Clicks» | «Level-3-Total-Clicks» | «Labels-1-3-Percent-Of-All-Clicks»% | «Labels-2-3-Percent-Of-All-Clicks»% | «Labels-3-3-Percent-Of-All-Clicks»% | «Labels-4-3-Percent-Of-All-Clicks»% | «Labels-5-3-Percent-Of-All-Clicks»% | «Labels-Other-3-Percent-Of-All-Clicks»%
4 | «Level-4-Emails-Sent-Attempted» | «Level-4-User-Clicks» | «Level-4-Total-Clicks» | «Labels-1-4-Percent-Of-All-Clicks»% | «Labels-2-4-Percent-Of-All-Clicks»% | «Labels-3-4-Percent-Of-All-Clicks»% | «Labels-4-4-Percent-Of-All-Clicks»% | «Labels-5-4-Percent-Of-All-Clicks»% | «Labels-Other-4-Percent-Of-All-Clicks»%
5 | «Level-5-Emails-Sent-Attempted» | «Level-5-User-Clicks» | «Level-5-Total-Clicks» | «Labels-1-5-Percent-Of-All-Clicks»% | «Labels-2-5-Percent-Of-All-Clicks»% | «Labels-3-5-Percent-Of-All-Clicks»% | «Labels-4-5-Percent-Of-All-Clicks»% | «Labels-5-5-Percent-Of-All-Clicks»% | «Labels-Other-5-Percent-Of-All-Clicks»%
6 | «Level-6-Emails-Sent-Attempted» | «Level-6-User-Clicks» | «Level-6-Total-Clicks» | «Labels-1-6-Percent-Of-All-Clicks»% | «Labels-2-6-Percent-Of-All-Clicks»% | «Labels-3-6-Percent-Of-All-Clicks»% | «Labels-4-6-Percent-Of-All-Clicks»% | «Labels-5-6-Percent-Of-All-Clicks»% | «Labels-Other-6-Percent-Of-All-Clicks»%
Table 12: Count of unique clicks per level by office
Level | Emails Sent | Unique Clicks | Total Clicks | «Labels-1-Name» | «Labels-2-Name» | «Labels-3-Name» | «Labels-4-Name» | «Labels-5-Name» | Other
1 | «Level-1-Emails-Sent-Attempted» | «Level-1-User-Clicks» | «Level-1-Total-Clicks» | «Labels-1-1-Unique-Clicks» | «Labels-2-1-Unique-Clicks» | «Labels-3-1-Unique-Clicks» | «Labels-4-1-Unique-Clicks» | «Labels-5-1-Unique-Clicks» | «Labels-Other-1-Unique-Clicks»
2 | «Level-2-Emails-Sent-Attempted» | «Level-2-User-Clicks» | «Level-2-Total-Clicks» | «Labels-1-2-Unique-Clicks» | «Labels-2-2-Unique-Clicks» | «Labels-3-2-Unique-Clicks» | «Labels-4-2-Unique-Clicks» | «Labels-5-2-Unique-Clicks» | «Labels-Other-2-Unique-Clicks»
3 | «Level-3-Emails-Sent-Attempted» | «Level-3-User-Clicks» | «Level-3-Total-Clicks» | «Labels-1-3-Unique-Clicks» | «Labels-2-3-Unique-Clicks» | «Labels-3-3-Unique-Clicks» | «Labels-4-3-Unique-Clicks» | «Labels-5-3-Unique-Clicks» | «Labels-Other-3-Unique-Clicks»
4 | «Level-4-Emails-Sent-Attempted» | «Level-4-User-Clicks» | «Level-4-Total-Clicks» | «Labels-1-4-Unique-Clicks» | «Labels-2-4-Unique-Clicks» | «Labels-3-4-Unique-Clicks» | «Labels-4-4-Unique-Clicks» | «Labels-5-4-Unique-Clicks» | «Labels-Other-4-Unique-Clicks»
5 | «Level-5-Emails-Sent-Attempted» | «Level-5-User-Clicks» | «Level-5-Total-Clicks» | «Labels-1-5-Unique-Clicks» | «Labels-2-5-Unique-Clicks» | «Labels-3-5-Unique-Clicks» | «Labels-4-5-Unique-Clicks» | «Labels-5-5-Unique-Clicks» | «Labels-Other-5-Unique-Clicks»
6 | «Level-6-Emails-Sent-Attempted» | «Level-6-User-Clicks» | «Level-6-Total-Clicks» | «Labels-1-6-Unique-Clicks» | «Labels-2-6-Unique-Clicks» | «Labels-3-6-Unique-Clicks» | «Labels-4-6-Unique-Clicks» | «Labels-5-6-Unique-Clicks» | «Labels-Other-6-Unique-Clicks»
Figure 10 shows the number of unique clicks for each “Office” by level of phishing deception.
Figure 10: Count of unique clicks by level per “Office”
Figure 11 shows the percentage of total unique clicks per “Office” by level of phishing deception. The
percentage is relative to the total unique clicks for that level.
Figure 11: Percentage of total unique clicks belonging to each “Office” by level
Appendix C: Templates
The phishing templates used throughout this service include six levels of varying deception. Each level is
based on a calculation of the factors designed to entice users to click on a malicious link. An explanation of the
four factors used when determining an email’s level of deception can be found in “Deception Calculator.” The
following defines the key differences between each level:
Level 1 – This level is the easiest to detect and features common scams that are easily identified.
Templates usually include low deception indicators such as bad grammar, fake domains, and unknown
external senders. The emails commonly appeal to emotions of greed, monetary gain, or guilt to elicit
a user’s response.
Level 2 – These email templates also use low deception indicators in that they typically appear to be from
an external, unauthoritative sender. The templates use one higher deception indicator, such as decent
or proper grammar, and are written in a tone of urgency to help the email appear genuine. They also
commonly appeal to emotions of greed, monetary gain, or guilt.
Level 3 – These email scams usually use two higher deception indicators, such as appearing to
be from a legitimate external organization or spoofing a known internal office or job title, but the link is
usually fake and not associated with the content. Templates typically include low deception indicators
such as bad grammar or fake domains. The theme of the templates can appeal to emotions or
guilt and create a sense of urgency if the request is not followed quickly.
Level 4 – These email templates use more high deception indicators than low and frequently appear
to come from a commonly found internal department, such as “IT Support” or “Human Resources.”
The templates request actions or information that real-world departments would be trained not to ask
users for over email, such as their passwords, private information, or financial accounts. The email templates
normally have decent grammar but are sent by a generic internal account with a fake domain. The
templates can appeal to a sense of urgency and fear of disciplinary action to elicit a user’s click.
Level 5 – This level includes seemingly common emails that might be sent from the respective organization
and uses information gathered from publicly available sources to seem genuine. The templates
usually come from internal and authoritative senders (high deception indicators) but with fake domains
or generic information (lower deception indicators). The templates can appeal to a sense of fear and the
need to comply with authority to elicit a user’s response.
Level 6 – This level’s templates are the hardest to detect and typically have the highest click rate.
These templates can appear to be from an internal office, personally address the recipient, and include
themes such as negative policy changes and responses to internal and external news that may negatively
affect employees. This level uses information gathered from public sources to develop the templates, such
as names of organizational individuals or departments, relevant news, or information about specific
organizational processes or tools.
Deception Calculator
The following calculator provides the method for determining the deception level of each template (1 being not
deceptive and easy to detect, and 6 being very deceptive and difficult to detect).
Table 13: Example Phishing email template deception calculator
Category | Indicator | Ranking Scale | Ranking
Appearance | Grammar | 0=Poor, 1=Decent, 2=Proper |
Appearance | Link Domain | 0=Fake, 1=Spoofed/Hidden |
Appearance | Logo/Graphics | 0=Fake/None, 1=Spoofed/HTML |
Sender | External | 0=Fake/NA, 1=Spoofed |
Sender | Internal | 0=Fake/NA, 1=Unknown Spoofed, 2=Known Spoofed |
Sender | Authoritative | 0=None, 1=Corporate/Local, 2=Federal/State |
Relevancy | Organization | 0=No, 1=Yes |
Relevancy | Public News | 0=No, 1=Yes |
Behavior (Optional) | Fear | No Score |
Behavior (Optional) | Duty or Obligation | No Score |
Behavior (Optional) | Curiosity | No Score |
Behavior (Optional) | Greed | No Score |
Total | | |
Appearance
Grammar – Writing level of the content in the email.
Poor indicates several misspellings and awkward use of the English language.
Decent indicates few misspellings and few grammatical errors, or the email is relatively short in length.
Proper indicates one or no misspellings or grammatical errors, and the email is longer and more formal
than the usual email.
Link Domain – The embedded link can be clearly fake or attempt to impersonate a real domain.
Logo/Graphics – Images can be either clearly fake or close imposters.
Sender
External – The sender and email domain are external to the user’s organization and are either clearly fake
or look like real external entities.
Internal – The sender and email domain appear to come from the user’s organization but are clearly fake
or spoof internal senders.
Authoritative (Corporate/Local or Federal/State) – The sender makes a request or demand and
speaks from a position of power that could be associated with a corporate/local office or federal/state
office.
Relevancy
Organization – Content is pertinent to the organization’s current events or news, or uses the name or email
of the targeted user.
Public News – Content is pertinent to current events in the local area or nation.
Behavior (Optional)
Fear – Scareware or appeals to the emotion of fear within the theme of the email.
Duty or Obligation – Appeals to a sense of responsibility within the theme of the email.
Curiosity – Appeals to the desire to learn more within the theme of the email.
Greed – Appeals to greed and monetary gain within the theme of the email.
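As a sketch of how the calculator tallies a template’s score, the scored indicators above can be summed as follows (the indicator keys and the example rankings are illustrative; Behavior indicators carry no score, and the final 1–6 level is assigned by the assessors based on the total):

```python
# Ranking ranges taken from the deception calculator table
SCALES = {
    "grammar": (0, 2),        # 0=Poor, 1=Decent, 2=Proper
    "link_domain": (0, 1),    # 0=Fake, 1=Spoofed/Hidden
    "logo_graphics": (0, 1),  # 0=Fake/None, 1=Spoofed/HTML
    "external": (0, 1),       # 0=Fake/NA, 1=Spoofed
    "internal": (0, 2),       # 0=Fake/NA, 1=Unknown Spoofed, 2=Known Spoofed
    "authoritative": (0, 2),  # 0=None, 1=Corporate/Local, 2=Federal/State
    "organization": (0, 1),   # 0=No, 1=Yes
    "public_news": (0, 1),    # 0=No, 1=Yes
}

def deception_score(rankings):
    """Sum the indicator rankings for a template, validating each against its scale."""
    total = 0
    for indicator, value in rankings.items():
        lo, hi = SCALES[indicator]
        if not lo <= value <= hi:
            raise ValueError(f"{indicator} ranking {value} outside {lo}-{hi}")
        total += value
    return total

# Hypothetical higher-deception template: proper grammar, spoofed link,
# known internal spoofed sender, organizationally relevant content
print(deception_score({"grammar": 2, "link_domain": 1,
                       "internal": 2, "organization": 1}))  # 6
```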
Engagement Specific Templates
Below are the details about the templates used during the PCA with «Acronym». The deception indicators are
shown in the table below. In addition, a text version of each email template is provided along with key
characteristics that describe the red flags (“RF”) and sophisticated techniques (“ST”) used to rank each
template. Basic statistics for each campaign are also provided below the key characteristics.
Deception Calculator
Table 14: Phishing email template deception calculator
Category | Indicator | Ranking Scale | «Level-1-Subject» | «Level-2-Subject» | «Level-3-Subject» | «Level-4-Subject» | «Level-5-Subject» | «Level-6-Subject»
Appearance | Grammar | 0=Poor, 1=Decent, 2=Proper | «Level-1-Complexity-Grammar» | «Level-2-Complexity-Grammar» | «Level-3-Complexity-Grammar» | «Level-4-Complexity-Grammar» | «Level-5-Complexity-Grammar» | «Level-6-Complexity-Grammar»
Appearance | Link Domain | 0=Fake, 1=Spoofed/Hidden | «Level-1-Complexity-Link-Domain» | «Level-2-Complexity-Link-Domain» | «Level-3-Complexity-Link-Domain» | «Level-4-Complexity-Link-Domain» | «Level-5-Complexity-Link-Domain» | «Level-6-Complexity-Link-Domain»
Appearance | Logo/Graphics | 0=Fake/None, 1=Spoofed/HTML | «Level-1-Complexity-Logo-Graphics» | «Level-2-Complexity-Logo-Graphics» | «Level-3-Complexity-Logo-Graphics» | «Level-4-Complexity-Logo-Graphics» | «Level-5-Complexity-Logo-Graphics» | «Level-6-Complexity-Logo-Graphics»
Sender | External | 0=Fake/NA, 1=Spoofed | «Level-1-Complexity-External» | «Level-2-Complexity-External» | «Level-3-Complexity-External» | «Level-4-Complexity-External» | «Level-5-Complexity-External» | «Level-6-Complexity-External»
Sender | Internal | 0=Fake/NA, 1=Unknown Spoofed, 2=Known Spoofed | «Level-1-Complexity-Internal» | «Level-2-Complexity-Internal» | «Level-3-Complexity-Internal» | «Level-4-Complexity-Internal» | «Level-5-Complexity-Internal» | «Level-6-Complexity-Internal»
Sender | Authoritative | 0=None, 1=Corporate/Local, 2=Federal/State | «Level-1-Complexity-Authoritative» | «Level-2-Complexity-Authoritative» | «Level-3-Complexity-Authoritative» | «Level-4-Complexity-Authoritative» | «Level-5-Complexity-Authoritative» | «Level-6-Complexity-Authoritative»
Relevancy | Organization | 0=No, 1=Yes | «Level-1-Complexity-Organization» | «Level-2-Complexity-Organization» | «Level-3-Complexity-Organization» | «Level-4-Complexity-Organization» | «Level-5-Complexity-Organization» | «Level-6-Complexity-Organization»
Relevancy | Public News | 0=No, 1=Yes | «Level-1-Complexity-Public-News» | «Level-2-Complexity-Public-News» | «Level-3-Complexity-Public-News» | «Level-4-Complexity-Public-News» | «Level-5-Complexity-Public-News» | «Level-6-Complexity-Public-News»
Behavior | Fear | No Score | «Level-1-Complexity-Fear» | «Level-2-Complexity-Fear» | «Level-3-Complexity-Fear» | «Level-4-Complexity-Fear» | «Level-5-Complexity-Fear» | «Level-6-Complexity-Fear»
Behavior | Duty or Obligation | No Score | «Level-1-Complexity-Duty-Obligation» | «Level-2-Complexity-Duty-Obligation» | «Level-3-Complexity-Duty-Obligation» | «Level-4-Complexity-Duty-Obligation» | «Level-5-Complexity-Duty-Obligation» | «Level-6-Complexity-Duty-Obligation»
Behavior | Curiosity | No Score | «Level-1-Complexity-Curiosity» | «Level-2-Complexity-Curiosity» | «Level-3-Complexity-Curiosity» | «Level-4-Complexity-Curiosity» | «Level-5-Complexity-Curiosity» | «Level-6-Complexity-Curiosity»
Behavior | Greed | No Score | «Level-1-Complexity-Greed» | «Level-2-Complexity-Greed» | «Level-3-Complexity-Greed» | «Level-4-Complexity-Greed» | «Level-5-Complexity-Greed» | «Level-6-Complexity-Greed»
Total | | | 1 | 2 | 3 | 4 | 5 | 6
Templates
Level 1 Template
Subject: «Level-1-Subject»
Key characteristics: «Level-1-Key-Characteristics»
Total emails sent: «Level-1-Emails-Sent-Attempted»
Unique targets who clicked: «Level-1-User-Clicks»
Unique click rate: «Level-1-Click-Rate»%
Total clicks: «Level-1-Total-Clicks»
Total user reports: «Level-1-User-Reports»
Report rate: «Level-1-User-Report-Rate»
Level 2 Template
Subject: «Level-2-Subject»
Key characteristics: «Level-2-Key-Characteristics»
Total emails sent: «Level-2-Emails-Sent-Attempted»
Unique targets who clicked: «Level-2-User-Clicks»
Unique click rate: «Level-2-Click-Rate»%
Total clicks: «Level-2-Total-Clicks»
Total user reports: «Level-2-User-Reports»
Report rate: «Level-2-User-Report-Rate»
Level 3 Template
Subject: «Level-3-Subject»
Key characteristics: «Level-3-Key-Characteristics»
Total emails sent: «Level-3-Emails-Sent-Attempted»
Unique targets who clicked: «Level-3-User-Clicks»
Unique click rate: «Level-3-Click-Rate»%
Total clicks: «Level-3-Total-Clicks»
Total user reports: «Level-3-User-Reports»
Report rate: «Level-3-User-Report-Rate»
Level 4 Template
Subject: «Level-4-Subject»
Key characteristics: «Level-4-Key-Characteristics»
Total emails sent: «Level-4-Emails-Sent-Attempted»
Unique targets who clicked: «Level-4-User-Clicks»
Unique click rate: «Level-4-Click-Rate»%
Total clicks: «Level-4-Total-Clicks»
Total user reports: «Level-4-User-Reports»
Report rate: «Level-4-User-Report-Rate»
Level 5 Template
Subject: «Level-5-Subject»
Key characteristics: «Level-5-Key-Characteristics»
Total emails sent: «Level-5-Emails-Sent-Attempted»
Unique targets who clicked: «Level-5-User-Clicks»
Unique click rate: «Level-5-Click-Rate»%
Total clicks: «Level-5-Total-Clicks»
Total user reports: «Level-5-User-Reports»
Report rate: «Level-5-User-Report-Rate»
Level 6 Template
Subject: «Level-6-Subject»
Key characteristics: «Level-6-Key-Characteristics»
Total emails sent: «Level-6-Emails-Sent-Attempted»
Unique targets who clicked: «Level-6-User-Clicks»
Unique click rate: «Level-6-Click-Rate»%
Total clicks: «Level-6-Total-Clicks»
Total user reports: «Level-6-User-Reports»
Report rate: «Level-6-User-Report-Rate»
Appendix D: Landing/Re-direct Page
«Landing-URL»
Appendix E: Top Phishing Techniques & Scams
In addition to training on the red flags and sophisticated techniques email users were most susceptible
to, CCA recommends that anti-phishing training and awareness cover the most common phishing techniques and
scams used to elicit clicks from target users in real-world phishing incidents. A phishing technique is an element
within a larger phishing scam.
Techniques
A fake link. The context of the email can be anything, but the key technique is using a link that looks like
a real URL (or hyperlinked text) that the target has used and trusted before. Underlying it, however, is the
actual URL of a different and potentially malicious site. Hovering over the link (on a computer) or
long-pressing it (on a mobile device) reveals the true underlying link. Unexpected URLs should not be clicked;
even if a link matches expectations, the user should type it directly into their browser.
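The display-text vs. actual-URL mismatch described above can be checked programmatically. The following is a minimal sketch using only the standard library; the HTML snippet, class name, and heuristic are illustrative, not a production detector:

```python
import re
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect (href, display_text) pairs from an HTML email body."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def looks_deceptive(href, text):
    """Heuristic: flag links whose visible text looks like a URL but points elsewhere."""
    return bool(re.match(r"https?://", text)) and not href.startswith(text)

# Hypothetical email: the displayed URL and the real destination differ
p = LinkAuditor()
p.feed('<a href="http://evil.example/x">http://bank.example/login</a>')
href, text = p.links[0]
print(looks_deceptive(href, text))  # True
```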
A fake website. Hovering over or long-pressing the displayed link will reveal a matching URL. However,
closer inspection of the URL would reveal slight differences between the displayed address and the real
address it imitates. Examples of slight differences include replacing a lowercase ‘l’ with a ‘1’, adding
extra letters, or changing the domain, such as from .com to .us. A few seconds’ pause to override autopilot
clicking could save a target from being phished.
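One way to catch such character-swap lookalikes is an edit-distance check against a list of trusted domains. This is a sketch; the trusted list and the distance threshold are illustrative assumptions:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,        # deletion
                           cur[j - 1] + 1,     # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# Hypothetical allow-list of domains the organization trusts
TRUSTED = ["paypal.com", "example.com"]

def lookalike_of(domain):
    """Return a trusted domain this one closely imitates, if any."""
    for trusted in TRUSTED:
        d = edit_distance(domain.lower(), trusted)
        if 0 < d <= 2:  # close to a trusted name, but not identical
            return trusted
    return None

print(lookalike_of("paypa1.com"))  # paypal.com  ('l' replaced with '1')
print(lookalike_of("paypal.com"))  # None  (exact match is trusted, not a lookalike)
```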
A malicious attachment. Just like the fake website, closer inspection of any attachments sent may
reveal discrepancies between what is there and what is expected. Questions that should be asked
of each email attachment include: Was the email expected? Is the sender known? Does their email
address match their display name? Does the attachment have the expected file type? If a target does
open the attachment, a document may then show a notice that “editing must be enabled to see content.”
If editing is enabled, macros and other malicious code are able to execute on the target’s computer.
Organizations should have procedures for verifying that suspicious attachments are safe through the use
of file checkers or other secure means.
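Two of the file-type discrepancies described above can be screened with a simple filename check. This is a sketch only; the extension list is illustrative and far from exhaustive, and filename checks are no substitute for content inspection:

```python
import os

# Extensions that can carry macros or executable content (non-exhaustive)
RISKY_EXTENSIONS = {".docm", ".xlsm", ".pptm", ".exe", ".js", ".vbs", ".scr", ".hta"}

def attachment_flags(filename):
    """Return a list of red flags found in an attachment filename."""
    flags = []
    base, ext = os.path.splitext(filename.lower())
    if ext in RISKY_EXTENSIONS:
        flags.append("macro- or script-capable file type")
    # Double extension, e.g. "invoice.pdf.exe" meant to display as "invoice.pdf"
    if os.path.splitext(base)[1]:
        flags.append("double extension")
    return flags

print(attachment_flags("invoice.pdf.exe"))  # ['macro- or script-capable file type', 'double extension']
print(attachment_flags("report.docx"))      # []
```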
Fake contact information. This technique is usually used to circumvent online portals or other established
procedures that normally handle change requests. The attacker provides the target with an email address or
phone number out of “generous convenience” so the target doesn’t have to spend time looking up that
information. Rather than trusting the email, the person should call a number they already have, such as
the customer service number on the back of their card, an office phone number in a directory, or one on
the organization’s site that the person navigates to separately.
Scams
Business Email Compromise (CEO Fraud). This scam has cost organizations nearly $26 billion since
coming on the scene in 2013, according to FBI reports. A company’s email infrastructure is hacked or
spoofed, and someone pretending to be the CEO tells the phishing target that they need help with an
urgent wire transfer. Sometimes the target is threatened with job loss or public shaming if they don’t
comply. Setting and following established protocols that require a verbal phone call to confirm the
transfer, or that require an additional person to submit the request, are two ways to keep this type of
scam from succeeding.
Business Email Compromise (Vendor Product Invoice). Similar to the CEO fraud scam, a supplier’s
email has been hacked or spoofed and the attacker uses it to send a company a fraudulent invoice. Having
clear processes in place to confirm vendor change requests or invoice processing is one way of stopping
this scam from working.
Stolen/Leaked passwords and blackmail. The information used to craft these emails comes from
the hundreds of millions of exposed online records. The attackers display a paired email and password,
claiming that possession of these credentials is “proof” the attacker has already hacked the victim’s
computer. In exchange for not sharing some type of incriminating, embarrassing, or scandalous content
with the target’s email contacts, the attacker demands payment, usually in the form of cryptocurrency.
Even though the hacking claims are false, this type of blackmail is a quickly rising scam because people
are too embarrassed to report it.
Security incident alert with offer to help. This scam uses the fake contact information or fake website
techniques in a ploy to first make the target panic and then trust the attacker as the quickest solution
to their problem.
Asking for money. These are usually long-winded stories of woe or sudden inheritance that require only
a small sum of money on the target’s part to transfer a huge sum of funds. Usually in exchange for
the “help,” the attacker promises to provide the target a decent portion of the funds. This scheme is
designed to be outlandish because it is self-filtering: attackers know only the most trusting or gullible
people will fall for this scam, and such targets are less likely to question further requests as the scam
continues.
Appendix F: Acronyms
Table 15: Acronyms
CISA Cybersecurity and Infrastructure Security Agency
CCA CISA Cyber Assessments
PCA Phishing Campaign Assessment
POC Point of Contact
ROE Rules of Engagement